Zeroth-order methods for noisy Hölder-gradient functions

Authors

Abstract

In this paper, we prove new complexity bounds for zeroth-order methods in non-convex optimization with inexact observations of the objective function values. We use the Gaussian smoothing approach of Nesterov and Spokoiny (Found Comput Math 17(2):527–566, 2015. https://doi.org/10.1007/s10208-015-9296-2) and extend their results, obtained for smooth problems, to the setting of minimization of functions with Hölder-continuous gradient with a noisy zeroth-order oracle, obtaining noise upper bounds as well. We consider a finite-difference gradient approximation based on normally distributed random vectors and prove that a gradient descent scheme based on this approximation converges to a stationary point of the smoothed function. We also consider convergence to a stationary point of the original (not smoothed) function and obtain bounds on the number of steps of the algorithm sufficient to make the norm of its gradient small. Additionally, we provide bounds on the noise level of the oracle for which the above guarantees still hold. We separately consider the case $$\nu = 1$$ and show that in this case the dependence of the bounds on the dimension can be improved.
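The smoothing-based estimator described in the abstract can be sketched as follows. This is a minimal illustration assuming a noiseless oracle, a single-sample finite-difference estimate averaged over a small batch, and a simple quadratic test objective; all names and parameter values are illustrative, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def zo_gradient_estimate(f, x, mu=1e-4):
    """One-sample finite-difference gradient estimate along a Gaussian direction.

    Approximates the gradient of the Gaussian-smoothed function; mu is the
    smoothing/finite-difference parameter (illustrative sketch only).
    """
    u = rng.standard_normal(x.shape)
    return (f(x + mu * u) - f(x)) / mu * u

def zo_gradient_descent(f, x0, step=0.05, iters=2000, mu=1e-4, batch=10):
    """Gradient descent driven by averaged zeroth-order gradient estimates."""
    x = x0.copy()
    for _ in range(iters):
        g = np.mean([zo_gradient_estimate(f, x, mu) for _ in range(batch)],
                    axis=0)
        x = x - step * g
    return x

f = lambda x: np.sum(x ** 2)  # smooth convex test objective
x = zo_gradient_descent(f, np.ones(5))
```

On this test problem the iterates approach the minimizer at the origin; for Hölder-gradient objectives and noisy oracles the paper's analysis dictates how `mu`, the step size, and the tolerable noise level must be chosen.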


Related articles

Stochastic First- and Zeroth-order Methods for Nonconvex Stochastic Programming

In this paper, we introduce a new stochastic approximation (SA) type algorithm, namely the randomized stochastic gradient (RSG) method, for solving an important class of nonlinear (possibly nonconvex) stochastic programming (SP) problems. We establish the complexity of this method for computing an approximate stationary point of a nonlinear programming problem. We also show that this method pos...
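A minimal sketch of the randomized-output idea behind an SA-type method like RSG, under the simplifying assumptions of a constant stepsize and a uniformly chosen output iterate (the actual method ties the selection probabilities to the stepsizes); all names here are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def randomized_sgd(stoch_grad, x0, step=0.1, iters=500):
    """Run SGD with a stochastic gradient oracle and return one iterate
    chosen at random, rather than the last iterate.

    Simplified sketch: uniform selection over all iterates with a constant
    stepsize (illustrative, not the paper's exact selection rule).
    """
    x = x0.copy()
    iterates = [x.copy()]
    for _ in range(iters):
        x = x - step * stoch_grad(x)
        iterates.append(x.copy())
    return iterates[rng.integers(len(iterates))]

# Noisy gradient oracle for the test objective f(x) = ||x||^2 / 2.
noisy_grad = lambda x: x + 0.01 * rng.standard_normal(x.shape)
out = randomized_sgd(noisy_grad, np.ones(3))
```

Returning a randomly selected iterate is what makes it possible to bound the expected gradient norm at the output point in the nonconvex setting.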


Determinants of Zeroth Order Operators

For compact Riemannian manifolds all of whose geodesics are closed (aka Zoll manifolds) one can define the determinant of a zeroth order pseudodifferential operator by mimicking Szego’s definition of this determinant for the operator: multiplication by a bounded function, on the Hilbert space of square-integrable functions on the circle. In this paper we prove that the non-local contribution to...


Gradient methods for minimizing composite functions

In this paper we analyze several new methods for solving optimization problems with the objective function formed as a sum of two terms: one is smooth and given by a black-box oracle, and another is a simple general convex function with known structure. Despite the absence of good properties of the sum, such problems, both in convex and nonconvex cases, can be solved with efficiency typical for...


Stochastic Zeroth-order Optimization in High Dimensions

We consider the problem of optimizing a high-dimensional convex function using stochastic zeroth-order queries. Under sparsity assumptions on the gradients or function values, we present two algorithms: a successive component/feature selection algorithm and a noisy mirror descent algorithm using Lasso gradient estimates, and show that both algorithms have convergence rates that depend only loga...


Hölder estimates for Green's functions on convex polyhedral domains and their applications to finite element methods

A model second-order elliptic equation on a general convex polyhedral domain in three dimensions is considered. The aim of this paper is twofold: first, sharp Hölder estimates for the corresponding Green’s function are obtained. As an application of these estimates to finite element methods, we show the best approximation property of the error in $$W^1_\infty$$. In contrast to previously known results, ...



Journal

Journal title: Optimization Letters

Year: 2021

ISSN: 1862-4472, 1862-4480

DOI: https://doi.org/10.1007/s11590-021-01742-z